
    Assessing infrequent large earthquakes using geomorphology and geodesy in the Malawi Rift

    In regions with large, mature fault systems, a characteristic earthquake model may be more appropriate for modelling earthquake occurrence than extrapolating from a short history of small, instrumentally observed earthquakes using the Gutenberg–Richter scaling law. We illustrate how the geomorphology and geodesy of the Malawi Rift, a region with large seismogenic thicknesses, long fault scarps, and slow strain rates, can be used to assess hazard probability levels for large, infrequent earthquakes. We estimate potential earthquake size from fault length and recurrence intervals from plate motion velocities, and generate a synthetic catalogue of events. Since the geomorphological information cannot determine whether a future rupture will be continuous (7.4 ≤ Mw ≤ 8.3, with recurrence intervals of 1,000–4,300 years) or segmented (6.7 ≤ Mw ≤ 7.7, with 300–1,900 years), we consider both alternatives separately and also produce a mixed catalogue. We carry out a probabilistic seismic hazard assessment to produce regional and site-specific hazard estimates. At all return periods and vibration periods, inclusion of fault-derived parameters increases the predicted spectral acceleration above the level predicted from the instrumental catalogue alone, with the largest changes close to the fault systems and the effect more pronounced at longer vibration periods. Importantly, the results indicate that standard probabilistic seismic hazard analysis (PSHA) methods using short instrumental records alone tend to underestimate the seismic hazard, especially for the most damaging, extreme-magnitude events. For the many developing countries in Africa and elsewhere that are experiencing rapid economic growth and urbanisation, seismic hazard assessments incorporating characteristic earthquake models are critical.
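    The fault-based step of the method above can be sketched in a few lines. The following is a minimal illustration, assuming the Wells and Coppersmith (1994) all-slip-type length-magnitude relation and the Hanks-Kanamori moment-magnitude definition; the shear modulus, rupture dimensions and slip rate are illustrative values, not those used in the study:

        import math

        MU = 3.0e10  # assumed crustal shear modulus, Pa

        def moment_magnitude_from_length(rupture_length_km):
            # Wells & Coppersmith (1994), all slip types:
            # Mw = 5.08 + 1.16 * log10(surface rupture length in km)
            return 5.08 + 1.16 * math.log10(rupture_length_km)

        def recurrence_interval_years(mw, length_km, width_km, slip_rate_mm_yr):
            # Average coseismic slip implied by the seismic moment,
            # divided by the geodetically derived slip rate.
            m0 = 10 ** (1.5 * mw + 9.05)                 # seismic moment, N*m
            area = (length_km * 1e3) * (width_km * 1e3)  # rupture area, m^2
            slip_m = m0 / (MU * area)                    # average slip per event, m
            return slip_m / (slip_rate_mm_yr * 1e-3)     # years

        # Hypothetical continuous rupture: 100 km long, 35 km seismogenic
        # width, loaded at 1 mm/yr
        mw = moment_magnitude_from_length(100.0)
        t = recurrence_interval_years(mw, 100.0, 35.0, slip_rate_mm_yr=1.0)
        print(f"Mw = {mw:.1f}, recurrence ~ {t:.0f} years")

    With these illustrative inputs the sketch gives Mw of about 7.4 and a recurrence interval of roughly 1,300 years, within the continuous-rupture ranges quoted above.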

    Atmospheric electric charge transfer in precipitation and associated synoptic conditions

    Measurements of atmospheric electricity have been made in the unpolluted air of Weardale during conditions of precipitation and in fair weather. An automatic recording system has been built to digitize instrument outputs on paper tape for subsequent computer analysis. The system was installed and run at Lanehead Field Centre and was also used to process magnetic tape recordings from the Land Rover mobile station. The system was expanded to include a 1-hour smoothing and sampling action for recording averaged values of fair-weather atmospheric electricity. At times of electrically quiet precipitation, measurements have been made of potential gradient, precipitation current density, space charge density and both polar conductivities. A new method of compensation for displacement currents has been used. Conductivity measurements have revealed a charge separation process close to the ground in rain, but not in snow. Techniques of variance spectrum analysis have been adopted for the precipitation work. Coherency spectra of potential gradient with precipitation current have indicated electrical 'cells' in nimbostratus, and their relevance to weather forecasting is discussed. The phase spectra of these two parameters have been examined to measure the height of electrical activity, which is found to coincide with the melting level, and an estimate is made of the conductivity of the charging region of the cloud. Digital filtering of records has disclosed a mechanical-transfer current of space charges, to an exposed rain receiver, opposite to the precipitation current. The diurnal variation of potential gradient at Lanehead has been refined with a further year's continuous observations in fair weather, and seasonal differences in the diurnal variations of potential gradient, air-earth current density and space charge density have been explained by increased convection in summer. The conduction current has been estimated by the indirect method, and the difference between this and the total air-earth current to an exposed plate is attributed to a mechanical-transfer current of space charges. Measurements in light winds have evinced the influence of the electrode effect.
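    The coherency and phase analysis described above is a standard cross-spectral technique. A minimal modern sketch using SciPy follows, in which the sampling rate, segment length and synthetic signals are illustrative assumptions rather than the thesis data:

        import numpy as np
        from scipy.signal import coherence, csd

        def coherency_and_phase(potential_gradient, precip_current, fs):
            # Magnitude-squared coherence and cross-spectral phase between
            # two simultaneously sampled records (Welch's method).
            f, cxy = coherence(potential_gradient, precip_current,
                               fs=fs, nperseg=1024)
            _, pxy = csd(potential_gradient, precip_current,
                         fs=fs, nperseg=1024)
            return f, cxy, np.angle(pxy)  # phase lag in radians per frequency

        # Synthetic demo: a shared slow 'cell' oscillation buried in noise
        rng = np.random.default_rng(0)
        fs = 1.0  # one sample per second (assumed)
        t = np.arange(4096) / fs
        cell = np.sin(2 * np.pi * 0.02 * t)
        pg = cell + 0.5 * rng.standard_normal(t.size)        # potential gradient
        ic = 0.8 * cell + 0.5 * rng.standard_normal(t.size)  # precipitation current
        f, cxy, phase = coherency_and_phase(pg, ic, fs)
        print(f"peak coherence {cxy.max():.2f} near {f[cxy.argmax()]:.3f} Hz")

    A phase lag phi at frequency f corresponds to a time lag of phi / (2 * pi * f); combined with an assumed hydrometeor fall speed, such a lag yields the height of the charging region, mirroring the melting-level result described above.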

    How effective is the Forestry Commission Scotland's woodland improvement programme, 'Woods In and Around Towns' (WIAT), at improving psychological well-being in deprived urban communities? A quasi-experimental study

    Introduction: There is a growing body of evidence suggesting that green spaces may positively influence psychological well-being. This project is designed to take advantage of a natural experiment in which planned physical and social interventions to enhance access to natural environments in deprived communities provide an opportunity to prospectively assess impacts on perceived stress and mental well-being.

    Study design and methods: A controlled, prospective study comprising a repeat cross-sectional survey of residents living within 1.5 km of intervention and comparison sites. Three waves of data will be collected: pre-physical-environment intervention (2013), post-physical-environment intervention (2014) and post-woodland-promotion social intervention (2015). The primary outcome will be a measure of perceived stress (Perceived Stress Scale) pre- and post-intervention. Secondary, self-reported outcomes include: mental well-being (Short Warwick-Edinburgh Mental Well-being Scale), changes in physical activity (IPAQ short form), health (EuroQol EQ-5D), perception and use of the woodlands, connectedness to nature (Inclusion of Nature in Self Scale), social cohesion and social capital. An environmental audit will complement the study by evaluating the physical changes in the environment over time and recording any other contextual changes. A process evaluation will assess the implementation of the programme. A health economics analysis will assess the cost consequences of each stage of the intervention in relation to the primary and secondary outcomes of the study.

    Ethics and dissemination: Ethical approval has been given by the University of Edinburgh, Edinburgh College of Art Research, Ethics and Knowledge Exchange Committee (ref. 19/06/2012). Findings will be disseminated through peer-reviewed publications, national and international conferences and, at the final stage of the project, a workshop for those interested in implementing environmental interventions.

    Risk and uncertainty assessment of volcanic hazards


    Digital pulse-shape discrimination of fast neutrons and gamma rays

    Discrimination between fast neutrons and gamma rays detected in a liquid scintillator has been investigated using digital pulse-processing techniques. An experimental setup with a 252Cf source, a BC-501 liquid scintillator detector, and a BaF2 detector was used to collect waveforms with a 100 MS/s, 14-bit sampling ADC. Three identical ADCs were combined to increase the sampling frequency to 300 MS/s. Four different digital pulse-shape analysis algorithms were developed and compared with each other and with data obtained from an analogue neutron-gamma discrimination unit. Two of the digital algorithms were based on the charge comparison method, while the analogue unit and the other two digital algorithms were based on the zero-crossover method. Two different figure-of-merit parameters, which quantify the neutron-gamma discrimination properties, were evaluated for all four digital algorithms and for the analogue data set. All of the digital algorithms gave similar or better figure-of-merit values than those obtained with the analogue setup. A detailed study of the discrimination properties as a function of the sampling frequency and bit resolution of the ADC was performed. It was shown that a sampling ADC with a bit resolution of 12 bits and a sampling frequency of 100 MS/s is adequate for achieving optimal neutron-gamma discrimination for pulses with deposited neutron energies in the range 0.3-12 MeV. The influence of the sampling frequency on the time resolution was also investigated; a FWHM of 1.7 ns was obtained at 100 MS/s.

    Comment: 26 pages, 14 figures, submitted to Nuclear Instruments and Methods in Physics Research
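    As a concrete illustration of the charge comparison method and the figure-of-merit evaluation, here is a minimal sketch assuming positive-going digitised waveforms; the gate lengths (in samples) and the Gaussian-FWHM assumption are illustrative choices, not the paper's settings:

        import numpy as np

        def charge_comparison_psd(pulse, n_baseline=10, short_gate=20, long_gate=100):
            # Charge comparison: the ratio of the tail integral to the total
            # integral separates neutrons (slower scintillation decay) from
            # gamma rays in organic liquid scintillators such as BC-501.
            p = pulse - pulse[:n_baseline].mean()   # subtract baseline estimate
            i0 = int(np.argmax(p))                  # anchor gates at the pulse peak
            total = p[i0:i0 + long_gate].sum()
            tail = p[i0 + short_gate:i0 + long_gate].sum()
            return tail / total if total > 0 else 0.0

        def figure_of_merit(psd_neutrons, psd_gammas):
            # FOM = peak separation / (FWHM_n + FWHM_g), assuming the two PSD
            # distributions are roughly Gaussian (FWHM = 2.355 * sigma).
            sep = abs(np.mean(psd_neutrons) - np.mean(psd_gammas))
            widths = 2.355 * (np.std(psd_neutrons) + np.std(psd_gammas))
            return sep / widths

    Applied to two populations of pulses tagged as neutrons and gamma rays (for example by time-of-flight against the BaF2 detector), a larger FOM indicates cleaner separation of the two PSD peaks.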

    Understanding causality and uncertainty in volcanic observations: an example of forecasting eruptive activity on Soufrière Hills Volcano, Montserrat

    Following a cessation in eruptive activity it is important to understand how a volcano will behave in the future and when it may next erupt. Such an assessment can be based on the volcano's long-term pattern of behaviour and insights into its current state via monitoring observations. We present a Bayesian network that integrates these two strands of evidence to forecast future eruptive scenarios using expert elicitation. The Bayesian approach provides a framework for quantifying magmatic causes in terms of their volcanic effects (i.e., eruption and unrest). In October 2013, an expert elicitation was performed to populate a Bayesian network designed to help forecast future eruptive (in-)activity at Soufrière Hills Volcano. The Bayesian network was devised to assess the state of the shallow magmatic system, as a means to forecast future eruptive activity in the context of the long-term behaviour of similar dome-building volcanoes. The findings highlight coherence amongst experts when interpreting the current behaviour of the volcano, but reveal considerable ambiguity when relating this to longer-term patterns of volcanism at dome-building volcanoes as a class. By asking questions in terms of magmatic causes, the Bayesian approach highlights the importance of using short-term unrest indicators from monitoring data as evidence in long-term forecasts at volcanoes. Furthermore, it highlights potential biases in the judgements of volcanologists and identifies sources of uncertainty in terms of magmatic causes rather than scenario-based outcomes.
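    The structure of such a network can be conveyed with a deliberately small sketch: a hidden magmatic state driving conditionally independent monitoring indicators, with the posterior obtained by Bayes' rule. All prior and likelihood values below are illustrative placeholders for elicited probabilities, not the study's figures:

        # Hypothetical two-layer (naive-Bayes) network: one hidden state,
        # several conditionally independent unrest indicators.
        PRIOR_ACTIVE = 0.5  # placeholder prior that the shallow system is active

        # P(indicator observed | magmatic state); values illustrative only
        LIKELIHOODS = {
            "seismic_swarm": {"active": 0.7, "inactive": 0.1},
            "deformation":   {"active": 0.6, "inactive": 0.2},
            "gas_emission":  {"active": 0.8, "inactive": 0.3},
        }

        def posterior_active(observed):
            # P(active | observations) via Bayes' rule, assuming the
            # indicators are conditionally independent given the state.
            p_act, p_inact = PRIOR_ACTIVE, 1.0 - PRIOR_ACTIVE
            for name, seen in observed.items():
                la = LIKELIHOODS[name]["active"]
                li = LIKELIHOODS[name]["inactive"]
                p_act *= la if seen else (1.0 - la)
                p_inact *= li if seen else (1.0 - li)
            return p_act / (p_act + p_inact)

        print(posterior_active({"seismic_swarm": True,
                                "deformation": False,
                                "gas_emission": True}))  # ~0.90

    In the elicitation itself, each conditional probability would come from the pooled expert judgements rather than the placeholder numbers used here.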

    Alternative Covid-19 mitigation measures in school classrooms: analysis using an agent-based model of SARS-CoV-2 transmission

    The SARS-CoV-2 epidemic has impacted children's education, with schools required to implement infection control measures that have led to periods of absence and classroom closures. We developed an agent-based epidemiological model of SARS-CoV-2 transmission in a school classroom that allows us to quantify projected infection patterns within primary school classrooms, and the related uncertainties. Our approach is based on a contact model constructed using random networks, informed by structured expert judgement. The effectiveness of mitigation strategies in suppressing infection outbreaks and limiting pupil absence is considered. COVID-19 infections in primary schools in England in autumn 2020 were re-examined, and the model was then used to estimate infection levels in autumn 2021, as the Delta variant was emerging and it was thought likely that school transmission would play a major role in an incipient new wave of the epidemic. Our results were in good agreement with the available data. These findings indicate that testing-based surveillance is more effective than bubble quarantine, both for reducing transmission and for avoiding pupil absence, even accounting for the insensitivity of self-administered tests. Bubble quarantine entails large numbers of absences, with only a modest impact on classroom infections. However, maintaining reduced contact rates within the classroom can have a major benefit for managing COVID-19 in school settings.
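    A deliberately simplified sketch of such an agent-based model is given below: each school day a random contact network is redrawn and infection passes across each infectious-susceptible contact with a fixed probability. The function name, parameter values and uniform contact probability are illustrative assumptions; the published model's contact network was informed by structured expert judgement:

        import random

        def simulate_classroom(n_pupils=30, p_contact=0.2, p_transmit=0.03,
                               infectious_days=5, n_days=28, seed=1):
            # Toy stochastic transmission model on a daily random contact
            # network, seeded with a single index case.
            rng = random.Random(seed)
            days_left = [0] * n_pupils      # remaining infectious days per pupil
            immune = [False] * n_pupils
            days_left[0] = infectious_days  # index case
            for day in range(n_days):
                if day % 7 >= 5:            # no classroom contacts at weekends
                    continue
                infectious = [i for i in range(n_pupils) if days_left[i] > 0]
                for i in infectious:
                    for j in range(n_pupils):
                        if j != i and days_left[j] == 0 and not immune[j]:
                            if rng.random() < p_contact and rng.random() < p_transmit:
                                days_left[j] = infectious_days
                for i in infectious:        # progress today's infectious pupils
                    days_left[i] -= 1
                    if days_left[i] == 0:
                        immune[i] = True
            return sum(immune) + sum(1 for d in days_left if d > 0)

        print("pupils ever infected:", simulate_classroom())

    Mitigations such as bubble quarantine or test-based surveillance can then be compared by adding rules that remove pupils from the contact loop, allowing the cost in absences and the benefit in suppressed transmission to be quantified side by side.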

    Isabelle/PIDE as Platform for Educational Tools

    The Isabelle/PIDE platform addresses the question of whether proof assistants of the LCF family are suitable as a technological basis for educational tools. The traditionally strong logical foundations of systems like HOL, Coq, or Isabelle have so far been counter-balanced by somewhat inaccessible interaction via the TTY (or minor variations like the well-known Proof General / Emacs interface). Thus the fundamental question of math education tools with fully-formal background theories has often been answered negatively, due to accidental weaknesses of existing proof engines. The idea of "PIDE" (which means "Prover IDE") is to integrate existing provers like Isabelle into a larger environment that facilitates access by end-users and other tools. We use Scala to expose the proof engine in ML to the JVM world, where many user interfaces, editor frameworks, and educational tools already exist. This shall ultimately lead to combined mathematical assistants, where the logical engine is in the background, without obstructing the view on applications of formal methods, formalized mathematics, and math education in particular.

    Comment: In Proceedings THedu'11, arXiv:1202.453